# Distillation Training

DeiT Small Distilled Patch16 224
Apache-2.0
This distilled DeiT model was pre-trained and fine-tuned on ImageNet-1k at 224x224 resolution, learning from a teacher CNN via knowledge distillation.
Image Classification Transformers
facebook
2,253
6
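The knowledge-distillation objective behind models like this one trains the student to match the teacher's temperature-softened output distribution. The sketch below is illustrative only: the function names and the temperature value are assumptions, not part of the model card, and the full DeiT recipe additionally uses a dedicated distillation token and hard-label loss.

```python
import math

def softmax(logits, temperature=1.0):
    # Temperature-scaled softmax; higher temperature softens the distribution.
    scaled = [z / temperature for z in logits]
    m = max(scaled)  # subtract the max for numerical stability
    exps = [math.exp(z - m) for z in scaled]
    total = sum(exps)
    return [e / total for e in exps]

def distillation_loss(student_logits, teacher_logits, temperature=3.0):
    # KL divergence from the softened teacher distribution to the student's.
    # The T^2 factor keeps gradient magnitudes comparable across temperatures.
    p = softmax(teacher_logits, temperature)
    q = softmax(student_logits, temperature)
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q)) * temperature ** 2

# Example: a student that already matches the teacher incurs zero loss.
teacher = [2.0, 0.5, -1.0]
print(distillation_loss(teacher, teacher))        # exactly matching logits
print(distillation_loss([0.0, 0.0, 0.0], teacher))  # mismatched logits, positive loss
```

In practice this term is combined with the ordinary cross-entropy on ground-truth labels, so the student learns from both the dataset and the teacher.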
© 2025 AIbase